Generalized Sparse Learning of Linear Models Over the Complete Subgraph Feature Set
Authors
Abstract
Similar resources
Bayesian Inference for Sparse Generalized Linear Models
We present a framework for efficient, accurate approximate Bayesian inference in generalized linear models (GLMs), based on the expectation propagation (EP) technique. The parameters can be endowed with a factorizing prior distribution, encoding properties such as sparsity or non-negativity. The central role of posterior log-concavity in Bayesian GLMs is emphasized and related to stability issu...
Robust Estimation in Linear Regression with Multicollinearity and Sparse Models
One of the factors affecting the statistical analysis of data is the presence of outliers. Methods that are not affected by outliers are called robust methods; robust regression methods estimate the parameters of a regression model in the presence of outliers. Besides outliers, the linear dependency of regressor variables, which is called multicollinearity...
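The abstract above contrasts robust estimators with ones sensitive to outliers. As a minimal, self-contained illustration of the general idea (not the estimator of the cited paper), the sketch below computes a Huber M-estimate of location via iteratively reweighted averaging; the data and the tuning constant `delta` are illustrative assumptions.

```python
def huber_location(xs, delta=1.0, iters=100):
    """Huber M-estimator of location via iteratively reweighted means.

    Observations within `delta` of the current estimate get full weight;
    farther observations are down-weighted, limiting outlier influence.
    """
    mu = sum(xs) / len(xs)  # start from the (non-robust) sample mean
    for _ in range(iters):
        w = [1.0 if abs(x - mu) <= delta else delta / abs(x - mu) for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu

# Illustrative data: one gross outlier (100) among small values.
data = [1.0, 2.0, 3.0, 4.0, 100.0]
mean = sum(data) / len(data)        # pulled toward the outlier (22.0)
robust = huber_location(data)       # stays near the bulk of the data
```

The sample mean is dragged to 22.0 by the single outlier, while the Huber estimate remains close to the uncontaminated observations, which is the robustness property the abstract refers to.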
Learning hybrid linear models via sparse recovery
We introduce new methods to tackle the problem of hybrid linear learning—learning the number and dimensions of the subspaces present in a collection of high-dimensional data and then determining a basis or overcomplete dictionary that spans each of the subspaces. To do this, we pose this problem as the estimation of a set of points on the Grassmanian manifold G(k, n), i.e., the collection of al...
On the Sparse Bayesian Learning of Linear Models
This work is a re-examination of the sparse Bayesian learning (SBL) of linear regression models of Tipping (2001) in a high-dimensional setting with a sparse signal. We show that in general the SBL estimator does not recover the sparsity structure of the signal. To remedy this, we propose a hard-thresholded version of the SBL estimator that achieves, for orthogonal design matrices, the nonasymp...
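The abstract above proposes hard-thresholding an estimator to recover the sparsity structure of a signal. As a generic sketch of the hard-thresholding operation itself (the coefficient vector and threshold below are illustrative, not values from the paper), one can zero out all entries whose magnitude does not exceed a threshold and read off the estimated support:

```python
def hard_threshold(coefs, t):
    # keep entries whose magnitude strictly exceeds t; zero out the rest
    return [b if abs(b) > t else 0.0 for b in coefs]

# Hypothetical estimated coefficients: a few large entries plus noise.
beta_hat = [2.1, 0.03, -1.7, 0.001, 0.0, 0.4]
beta_sparse = hard_threshold(beta_hat, 0.1)
support = [i for i, b in enumerate(beta_sparse) if b != 0.0]
```

Here `support` picks out the indices of the coefficients that survive thresholding, which is how a thresholded estimator turns a dense estimate into an explicit sparsity pattern.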
Efficient active learning with generalized linear models
Active learning can significantly reduce the amount of training data required to fit parametric statistical models for supervised learning tasks. Here we present an efficient algorithm for choosing the optimal (most informative) query when the output labels are related to the inputs by a generalized linear model (GLM). The algorithm is based on a Laplace approximation of the posterior distribut...
Journal
Journal title: IEEE Transactions on Pattern Analysis and Machine Intelligence
Year: 2017
ISSN: 0162-8828, 2160-9292
DOI: 10.1109/tpami.2016.2567399